
Asia-Pacific Journal of Neural Networks and Its Applications

Volume 1, No. 1, 2017, pp 15-20
http://dx.doi.org/10.21742/ajnnia.2017.1.1.03


A Method for Correcting Numerical Errors of Softmax and Cross-Entropy According to the CNN's Output Value



    Jun Mo Jeong, Se Jin Choi, Chiyong Kim
    Seokyeong University, Korea

    Abstract

    In this paper, a method is proposed to correct the numerical errors that can occur in the output values when cross-entropy is used as the loss function on the softmax output of a convolutional neural network with ReLU activations. Because the existing softmax and cross-entropy functions involve exponential and logarithmic operations, they suffer from numerical errors: when a value in the final output layer becomes too large, the computation overflows to infinity or underflows to zero. In this paper, the problem is solved by transforming the equations without changing the mathematical behavior of the existing softmax and cross-entropy functions. As a result, the numerical error problem is corrected and training proceeds normally regardless of the magnitude of the values in the final output layer.
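
    The paper's exact transformation is not reproduced on this abstract page, but the standard correction of this kind is the max-subtraction (log-sum-exp) identity: softmax is invariant to subtracting a constant from all logits, so shifting by the largest logit keeps every exponent at or below zero, and folding the logarithm into a log-sum-exp keeps the cross-entropy finite. The sketch below is an illustrative NumPy version under that assumption; the function names are ours, not the authors'.

        import numpy as np

        def stable_softmax(z):
            # softmax(z) = softmax(z - c) for any constant c, so shifting by
            # the maximum logit keeps every exponent <= 0 and avoids overflow.
            shifted = z - np.max(z)
            exp_z = np.exp(shifted)
            return exp_z / np.sum(exp_z)

        def stable_cross_entropy(z, target):
            # -log softmax(z)[target] = logsumexp(z) - z[target], computed
            # without forming probabilities that could underflow to 0 in log().
            m = np.max(z)
            log_sum_exp = m + np.log(np.sum(np.exp(z - m)))
            return log_sum_exp - z[target]

        # With large logits the naive form fails (exp(1002) overflows to inf),
        # while the corrected form returns finite, accurate values.
        z = np.array([1000.0, 1001.0, 1002.0])
        print(stable_softmax(z))           # [0.0900 0.2447 0.6652]
        print(stable_cross_entropy(z, 2))  # ~0.4076

    The same shift cancels out of both the numerator and denominator of softmax, which is why the transformation leaves the function's output, and therefore the gradients used in training, unchanged.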


 
